Web Survey Bibliography
Many companies face the question of whether administering surveys to their employees via Internet technologies will lead to different results than surveying them with paper and pencil, simply because of the technology itself or the psychological aspects of using it. A 2x2x2x2 factorial (quasi-)experimental design investigated whether anonymity, voluntariness, age, and administration mode (Internet versus paper-and-pencil) would influence answering behavior in a newly designed short-format employee survey. Employees in one complete section of a large Swiss company (N=655) were randomly assigned to conditions and invited to participate in the survey. Overall, 318 employees responded, with lower return rates in the anonymous and voluntary conditions: 31% for anonymous & voluntary versus 58% for less anonymous & less voluntary, with the other conditions in between. There was also a tendency toward differences in dropout. Voluntariness had a significant effect on response times, a dependent measure that could be collected in the Internet condition only. Age and anonymity had significant effects on expressed job satisfaction; voluntariness and administration mode showed tendencies to moderate these influences. Effects of anonymity were stronger for participants who chose to reveal even less information about themselves by leaving questions with identifying content unanswered. Results will be discussed in terms of the strategic aspect of the Social Identity model of Deindividuation Effects (SIDE). Methods and applications used in developing and administering the employee survey and in analyzing the data will be presented. Recommendations will be made regarding increasing response rates, avoiding social desirability in survey answering behavior, and organizing Internet-based employee surveys.
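The random assignment described in the abstract can be sketched as follows. This is an illustrative Python sketch under assumed factor and condition names, not the study's actual procedure; note that age is an observed quasi-experimental factor rather than an assigned one, which is why only the three manipulable factors appear here.

```python
import random

# Hypothetical factor names for the three manipulable factors of the design;
# age is observed, not assigned, hence the "(quasi-)experimental" label.
FACTORS = {
    "anonymity": ["anonymous", "identified"],
    "voluntariness": ["voluntary", "mandatory"],
    "mode": ["internet", "paper"],
}

def assign_conditions(employee_ids, seed=0):
    """Map each employee ID to one randomly drawn cell of the 2x2x2 design."""
    rng = random.Random(seed)  # seeded for a reproducible assignment
    cells = [
        (a, v, m)
        for a in FACTORS["anonymity"]
        for v in FACTORS["voluntariness"]
        for m in FACTORS["mode"]
    ]
    return {emp: rng.choice(cells) for emp in employee_ids}

# All 655 employees of the surveyed section are assigned to conditions.
assignments = assign_conditions(range(655))
```

Simple random assignment as sketched here yields unequal cell sizes by chance; a blocked (balanced) assignment would be a common alternative, and the abstract does not say which the authors used.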
Web survey bibliography (4086)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not...; 2017; Toepoel, V.; Emerson, H.
- Mind the Mode: Differences in Paper vs. Web-Based Survey Modes Among Women With Cancer; 2017; Hagan, T. L.; Belcher, S. M.; Donovan, H. S.
- Answering Without Reading: IMCs and Strong Satisficing in Online Surveys; 2017; Anduiza, E.; Galais, C.
- Ideal and maximum length for a web survey; 2017; Revilla, M.; Ochoa, C.
- Social desirability bias in self-reported well-being measures: evidence from an online survey; 2017; Caputo, A.
- Web-Based Survey Methodology; 2017; Wright, K. B.
- Handbook of Research Methods in Health Social Sciences; 2017; Liamputtong, P.
- Lessons from recruitment to an internet based survey for Degenerative Cervical Myelopathy: merits of...; 2017; Davies, B.; Kotter, M. R.
- Web Survey Gamification - Increasing Data Quality in Web Surveys by Using Game Design Elements; 2017; Schacht, S.; Keusch, F.; Bergmann, N.; Morana, S.
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Comparability of web and telephone surveys for the measurement of subjective well-being; 2017; Sarracino, F.; Riillo, C. F. A.; Mikucka, M.
- Achieving Strong Privacy in Online Survey; 2017; Zhou, Yo.; Zhou, Yi.; Chen, S.; Wu, S. S.
- A Meta-Analysis of the Effects of Incentives on Response Rate in Online Survey Studies; 2017; Mohammad Asire, A.
- Telephone versus Online Survey Modes for Election Studies: Comparing Canadian Public Opinion and Vote...; 2017; Breton, C.; Cutler, F.; Lachance, S.; Mierke-Zatwarnicki, A.
- Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate...; 2017; Saleh, A.; Bista, K.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico.; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- Rates, Delays, and Completeness of General Practitioners’ Responses to a Postal Versus Web-Based...; 2017; Sebo, P.; Maisonneuve, H.; Cerutti, B.; Pascal Fournier, J.; Haller, D. M.
- Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary...; 2017; Meitinger, K.
- Nonresponse in Organizational Surveying: Attitudinal Distribution Form and Conditional Response Probabilities...; 2017; Kulas, J. T.; Robinson, D. H.; Kellar, D. Z.; Smith, J. A.
- Theory and Practice in Nonprobability Surveys: Parallels between Causal Inference and Survey Inference...; 2017; Mercer, A. W.; Kreuter, F.; Keeter, S.; Stuart, E. A.
- Is There a Future for Surveys?; 2017; Miller, P. V.
- Reducing speeding in web surveys by providing immediate feedback; 2017; Conrad, F.; Tourangeau, R.; Couper, M. P.; Zhang, C.
- Social Desirability and Undesirability Effects on Survey Response Latencies; 2017; Andersen, H.; Mayerl, J.
- A Working Example of How to Use Artificial Intelligence To Automate and Transform Surveys Into Customer...; 2017; Neve, S.
- A Case Study on Evaluating the Relevance of Some Rules for Writing Requirements through an Online Survey...; 2017; Warnier, M.; Condamines, A.
- Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response...; 2017; Kappelhof, J. W. S.; De Leeuw, E. D.
- Targeted letters: Effects on sample composition and item non-response; 2017; Bianchi, A.; Biffignandi, S.